Some early observations on the Google December core update
These observations are based on Google's Quality Raters' Guidelines, on patterns we have seen in past core updates, and on sound SEO advice.
What is a core update?
Google makes changes to their search algorithms on a daily basis. A few times a year, they release significant changes to their core search algorithms and systems that are much more noticeable. In Google’s own words, core updates are, “designed to ensure that overall, [Google is] delivering on [their] mission to present relevant and authoritative content to searchers.”
If your website’s traffic has declined following a Google core update, it most likely means Google’s algorithms have determined that other pages on the web are more relevant and helpful than yours. It can be frustrating for SEOs not to know how Google makes this determination.
Google’s documentation on How Search Works describes the steps taken to return relevant results to searchers:
1. Organize the content on the web: As Google crawls the web, they organize pages in an index. They take note of key signals on each page such as which keywords are on the page, how up to date the page is, and more.
2. Determine the meaning of the searcher’s query: In order to understand which pages to recommend to a searcher, Google needs to understand the meaning of each query. Algorithms determine, for example, whether a query is looking for fresh, new content. Some words in a query are easy for Google to decipher, while others require more context. Google’s algorithms can now do a good job of understanding whether a searcher is looking for a single, distinct answer, in which case Google could present a fact from the Knowledge Graph, or whether they are doing research and would rather read more thorough information and peruse Google’s organic results.
3. Determine which pages are the most helpful to return: Once Google understands the intent behind a query, their goal is to return the web pages that are the most helpful in answering it.
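Google’s real systems are of course vastly more complex and not publicly documented, but as a rough mental model of steps 1 and 3, here is a toy sketch in Python: a tiny inverted index that records which documents contain which terms (plus a simple freshness signal), then answers a query by intersecting those term lists.

```python
from collections import defaultdict

# Toy illustration only: a minimal inverted index that records which
# documents contain which terms, plus a simple "freshness" signal.
# This is a mental model for the steps above, not a description of
# Google's actual indexing or ranking systems.
documents = {
    "page-a": {"text": "how many octaves does a guitar have", "year": 2020},
    "page-b": {"text": "how many octaves does a piano have", "year": 2015},
    "page-c": {"text": "guitar tuning and octaves explained", "year": 2019},
}

index = defaultdict(set)
for doc_id, doc in documents.items():
    for term in doc["text"].split():
        index[term].add(doc_id)

def search(query: str):
    """Return documents containing every query term, freshest first."""
    matches = set(documents)
    for term in query.split():
        matches &= index.get(term, set())
    return sorted(matches, key=lambda d: documents[d]["year"], reverse=True)

print(search("guitar octaves"))  # ['page-a', 'page-c']
```

Even this toy version shows why the interesting question is not “which pages contain the words” but which of those pages is actually the most helpful one to return.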
But how does Google determine which pages are the best to return to a searcher? Google has a blog post dedicated to explaining core updates that also offers advice to site owners. Much of our methodology in diagnosing the cause of a traffic drop is based on the items described in that post.
The Quality Raters’ Guidelines can give us clues about Google updates
In the past, many Google updates had a very clear and easily discernible focus. Sites affected by the early iterations of Google’s Penguin algorithm generally had problems with low-quality, spammy link building. Sites affected by Google’s Panda algorithm, another algorithm released many years ago, were usually easy to identify as having large amounts of thin, unhelpful content.
Core updates usually do not have one single, obvious focus. If your site was negatively affected, there is rarely a single smoking gun to blame.
The good news is that we can get some clues as to what improvements Google wants to make to search by studying changes made to Google’s Quality Raters’ Guidelines (QRG). When these guidelines update, we pay attention!
When a Google engineer is writing code to improve the algorithm, they present the Quality Raters with two sets of search results to review. One set shows the results as they currently exist for a given query; the second shows what the results would look like if the engineer’s proposed changes to the algorithm were implemented. The raters then evaluate both sets of results based on their knowledge of the QRG, and their feedback is given to the engineer.
Sometimes the QRG can give us clues as to what Google engineers are working on changing in Google’s algorithms. For example, in the summer of 2018, just prior to the August 1 core update, Google modified the QRG to add the words “safety of users” when describing YMYL pages.
That August 1, 2018 update, now widely known as the “Medic” update, caused huge reductions in rankings for many medical and nutritional sites that had serious reputation issues, a lack of real-life expertise, or other untrustworthy characteristics.
Similarly, the June 3, 2019 Google core update had a strong impact on alternative medicine sites that contradicted scientific consensus. Again, this was in line with what we see in the QRG.
More recently, in October of 2020, we set out to see what changes Google had made, as these likely give us clues about what Google’s search engineers are trying to accomplish with future updates. The most obvious addition to the QRG was a number of new examples explaining to the raters how to determine whether a page meets a searcher’s needs. As we discussed in our article on understanding user intent, the raters are shown an example of a page that could be returned for the query “how many octaves on a guitar”. The raters are told the page itself is high quality and has medium to high E-A-T. However, because the page discusses octaves on a piano rather than on a guitar, which is what the query was about, the raters are instructed to mark it as “Fails to Meet” on the “Needs Met” scale.
Similarly, in the QRG, a Wikipedia article on ATMs was deemed to be a page with high E-A-T, but not one that meets the needs of a searcher who typed in “ATM”. That searcher is not looking for a Wikipedia article; they most likely want to know where the closest ATM is.
It is our belief that if Google made a point of showing the Quality Raters examples of pages that have good E-A-T but are still not the best pages to meet a searcher’s needs, we would see this reflected in a future update.
We suspected that if Google was going to work on surfacing pages that do a good job in terms of “needs met”, this meant they were leaning heavily on Natural Language Processing, and in particular BERT, in order to understand language.
With all of this said, however, we were disappointed to hear Danny Sullivan say that the December core update did not have anything to do directly with BERT.
We thought that Google was using BERT in conjunction with other frameworks such as the SMITH model or BigBird, each of which allows a search engine to analyze much longer chunks of text than BERT can, in order to ascertain whether the text truly is the answer a searcher is looking for. It’s possible that this is still happening…perhaps it has been happening for a while now. We have speculated in the past that the unannounced November 8, 2019 update marked some type of change to Google’s use of NLP.
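We cannot see inside Google’s systems, but for readers who want a feel for what “using an NLP model to check needs met” could look like in practice, here is a small illustrative sketch using the open-source sentence-transformers library (our choice for illustration only; it is not what Google uses). It scores two passages against the octaves-on-a-guitar query from the QRG example above; we would expect the guitar passage to score noticeably higher than the piano one.

```python
# Illustrative only: sentence-transformers is an open-source library we
# chose for this sketch; it is not Google's internal system.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small pretrained BERT-style model

query = "how many octaves on a guitar"
passages = [
    "A standard 22-fret guitar covers roughly four octaves.",
    "A full-size piano keyboard spans just over seven octaves.",
]

query_emb = model.encode(query, convert_to_tensor=True)
passage_embs = model.encode(passages, convert_to_tensor=True)

# Cosine similarity between the query and each passage: a higher score
# means the passage is a closer semantic match to what was asked.
scores = util.cos_sim(query_emb, passage_embs)[0]
for passage, score in zip(passages, scores):
    print(f"{score.item():.3f}  {passage}")
```

Models like SMITH and BigBird extend this same idea to much longer stretches of text, which is why we suspected they might play a role in surfacing pages that truly meet a searcher’s needs.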
As we continue to analyze sites that won and lost in this update, and in particular pages that improved or declined for particular keyword searches, one pattern stands out to us, and it is a hard one to describe succinctly. In most cases, the change Google made really did seem to surface more relevant and helpful results, but it is often hard to explain why.
This may be why John Mueller of Google responded to me on Twitter with a quote from The Little Prince when I mentioned we were digging in to try to figure out this update!
Examples of sites affected by the December core update
As we do not want to share the private data of our clients, much of what you see below is taken from a list of sites that have been publicly identified as winners or losers of this update in this article by Lily Ray.
DrAxe.com
This site has been discussed a lot in SEO circles since it took a drastic hit in the August 1, 2018 core update. The majority of the site discusses alternative medicine topics. According to data from Semrush, the site saw gains on many pages following the December core update.
For example, let’s look at this page, which saw nice improvements across many keywords.
Ahrefs data tells us that this page had greatly improved rankings for many keywords.
1) Authority may be weighted more heavily. In a Google video, they shared that in the past they have improved their search results by putting greater emphasis on authority over relevance in some cases. This makes sense, as a whitepaper on how Google fights disinformation tells us that in times of crisis Google may choose to prefer authority over relevance and other ranking factors. In 2020, the world is certainly in a time of crisis. (A toy illustration of this idea follows this list.)
2) Descriptive headlines and titles matter. One of the questions Google asks site owners to consider in their blog post on core updates is, “Does the headline and/or page title provide a descriptive, helpful summary of the content?”
3) Google seems to be doing a better job of surfacing content that is relevant to a query. Even though Danny Sullivan said that the December core update was not directly related to BERT, the one pattern we can see across most, if not all, of the keywords that changed in our client base is that Google did a better job of surfacing helpful pages that meet the needs of the searcher.
4) Is UX a factor? We did not discuss this in this article, but after reading this article by Kevin Indig, in which he notes that many sites that declined with the update had a horrible ad experience, we are paying more attention to this as well. If you declined with this update and have a large number of ads, especially ads that interfere with a user’s ability to read the main content, it may be worth experimenting with showing fewer ads. But know that if Google did change something with this core update in regard to ads, you may need to wait until another core update to see improvements.
5) A few other things. We are also currently investigating whether this update had a stronger than normal impact on cryptocurrency sites. This is challenging to assess, however, as Bitcoin has had renewed strength in the last couple of weeks. Similarly, we are investigating whether user-generated content has been given more value, as many of our clients that did well with this update saw improvements in keyword rankings for their forum pages or other pages with lots of user-generated content. Not all of these clients were medical sites.
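To make point 1 above more concrete, here is a deliberately simplified toy model. This is our own illustration, not anything Google has published: each candidate page gets a hypothetical relevance score and authority score, and shifting the weight between the two changes which page ranks first, without either page changing at all.

```python
# Toy illustration only: Google has not published any such formula.
# The scores and weights below are made up for demonstration purposes.
pages = {
    "well-referenced medical site": {"relevance": 0.70, "authority": 0.95},
    "highly relevant but unvetted blog": {"relevance": 0.90, "authority": 0.40},
}

def rank(pages, authority_weight):
    """Order pages by a weighted blend of relevance and authority."""
    def score(p):
        return (1 - authority_weight) * p["relevance"] + authority_weight * p["authority"]
    return sorted(pages, key=lambda name: score(pages[name]), reverse=True)

print(rank(pages, authority_weight=0.2))  # relevance dominates: the blog ranks first
print(rank(pages, authority_weight=0.6))  # authority dominates: the medical site ranks first
```

The point is simply that the same two pages can swap positions purely because the weighting shifted, which is consistent with what site owners experience during a core update.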
It is difficult to give specific recovery advice, given that there is rarely a single culprit to blame when a site has been negatively affected by a core update. Our approach is always to do all we can to improve the site based on the information in Google’s Quality Raters’ Guidelines and Google’s blog post on core updates.
Here are our recommendations:
- Read the Quality Raters’ Guidelines and pay close attention to any examples given in verticals similar to yours.
- Make sure that every medical claim made (and really any important fact) is supported by strong references from authoritative, trustworthy sources.
- Be clear and up front about who you are, who is responsible for the content on your site, your refund policies and monetization methods.
- Look at keywords that declined in rankings and see which of your competitors’ pages improved at the same time. The goal is to determine how they are doing a better job of meeting a searcher’s needs and whether you can change your content so that it would truly be the best choice (see the sketch after this list for one way to pull that data together).
- Make good use of descriptive headings in your articles.
- If your content is not written by someone who is a subject matter expert, you may need to find more authoritative authors. In many cases for medical content, it may be sufficient to have a medical reviewer alongside your author.
- Don’t overdo it with ads. If you have intrusive ads, consider removing them.
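As mentioned in the list above, comparing your own ranking data from before and after the update is often the fastest way to find the queries worth studying. Here is a small, hypothetical Python sketch; the file names and column names (“keyword”, “position”) are assumptions and will need to match whatever your rank-tracking tool (Semrush, Ahrefs, etc.) actually exports.

```python
# Hypothetical sketch: compare two rank-tracking exports taken before and
# after a core update to find the keywords where your site dropped.
import pandas as pd

before = pd.read_csv("rankings_before_update.csv")   # your site, pre-update
after = pd.read_csv("rankings_after_update.csv")     # your site, post-update

merged = before.merge(after, on="keyword", suffixes=("_before", "_after"))
merged["change"] = merged["position_after"] - merged["position_before"]

# A positive change means you moved down the page (e.g. position 3 -> 8).
losers = merged[merged["change"] > 0].sort_values("change", ascending=False)
print(losers[["keyword", "position_before", "position_after", "change"]].head(20))
```

From there, search each of those losing keywords manually and study the competitor pages that now rank above you.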
Of course, it is always recommended to perform a thorough technical review of a site that is not performing well. In our experience though, technical issues are rarely the cause of a traffic drop following a Google core update.